Abstract
Online learning platforms often rely on passive video lectures, which can be challenging for learners who experience attention difficulties, particularly neurodivergent learners. Many students struggle to maintain focus during long lectures and may miss important explanations without realizing it.
This paper presents NeuroNote, a mobile learning application designed to assist learners by monitoring attention during lecture playback and providing interactive learning support. The system detects user distraction using camera-based gaze monitoring and pauses the lecture when attention loss is detected. A flashcard summarizing the missed lecture segment is then displayed to reinforce understanding.
In addition, the application provides automatic lecture transcription and summary generation to help learners review key concepts efficiently. The system is implemented using Flutter for mobile development, SQLite for local data storage, and Supabase for optional cloud synchronization. The proposed system demonstrates how attention-aware learning tools can support neurodivergent learners and improve engagement in digital learning environments.
Introduction
This paper presents NeuroNote, a mobile-based intelligent learning system designed to improve attention and learning retention during online video lectures, especially for neurodivergent learners who struggle to maintain focus.
Traditional e-learning platforms are passive and do not detect when students lose attention, often forcing learners to rewatch entire lectures. NeuroNote addresses this gap by using real-time gaze tracking via the device camera to monitor user attention. When distraction is detected, the system automatically pauses the lecture and generates flashcards summarizing the missed content, helping learners quickly catch up.
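A minimal sketch of this pause-and-recall loop is given below, assuming a gaze tracker that emits discrete attention states; the class and method names (AttentionState, LecturePlayer, DistractionHandler) are illustrative and do not reflect the actual NeuroNote API.

```dart
// Sketch of the distraction-handling loop: pause on attention loss,
// show a recall flashcard for the missed segment, then resume.
enum AttentionState { focused, distracted }

class LecturePlayer {
  Duration position = Duration.zero;
  bool playing = true;

  void pause() {
    playing = false;
    print('Lecture paused at ${position.inSeconds}s');
  }

  void resume() {
    playing = true;
    print('Lecture resumed');
  }
}

class DistractionHandler {
  final LecturePlayer player;
  Duration? _distractionStart; // start of the missed segment

  DistractionHandler(this.player);

  // React to each attention sample coming from the gaze tracker.
  void onSample(AttentionState state) {
    if (state == AttentionState.distracted && player.playing) {
      _distractionStart = player.position;
      player.pause();
    } else if (state == AttentionState.focused && _distractionStart != null) {
      // In the real system the segment spans from attention loss to the pause.
      showFlashcard(_distractionStart!, player.position);
      _distractionStart = null;
      player.resume();
    }
  }

  // Placeholder for flashcard generation over the missed segment.
  void showFlashcard(Duration from, Duration to) {
    print('Flashcard for segment ${from.inSeconds}s-${to.inSeconds}s');
  }
}

void main() {
  final player = LecturePlayer();
  final handler = DistractionHandler(player);

  // Simulated gaze samples standing in for the camera-based tracker.
  for (final s in [
    AttentionState.focused,
    AttentionState.distracted,
    AttentionState.focused,
  ]) {
    handler.onSample(s);
  }
}
```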
In addition, the system uses speech-to-text transcription and summarization to generate structured lecture notes for later revision, which is intended to reduce cognitive load and improve knowledge retention.
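As a concrete illustration of this step, the sketch below condenses transcript sentences into a short set of notes using simple word-frequency scoring; this scoring rule is an assumed stand-in for the summarizer actually used, and the function names are illustrative.

```dart
// Pick the most information-dense sentences from a transcript,
// assuming the speech-to-text step already yields sentence-level text.
List<String> summarize(List<String> sentences, {int maxSentences = 3}) {
  // Count how often each word appears across the whole transcript.
  final freq = <String, int>{};
  for (final s in sentences) {
    for (final w in s.toLowerCase().split(RegExp(r'\W+'))) {
      if (w.isNotEmpty) freq[w] = (freq[w] ?? 0) + 1;
    }
  }

  // Score each sentence by the total frequency of its words.
  int score(String s) => s
      .toLowerCase()
      .split(RegExp(r'\W+'))
      .where((w) => w.isNotEmpty)
      .fold(0, (sum, w) => sum + (freq[w] ?? 0));

  // Keep the highest-scoring sentences while preserving lecture order.
  final ranked = [...sentences]..sort((a, b) => score(b).compareTo(score(a)));
  final keep = ranked.take(maxSentences).toSet();
  return sentences.where(keep.contains).toList();
}

void main() {
  final transcript = [
    'Gradient descent updates weights using the gradient of the loss.',
    'Today we also discuss the history of optimization.',
    'The learning rate scales each gradient descent update step.',
  ];
  print(summarize(transcript, maxSentences: 2).join('\n'));
}
```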
The literature review highlights related work in eye tracking [1], AI-based personalized learning [2], gesture recognition [3], inclusive education systems [4], [5], and learner affect detection [6], which together support the use of AI for adaptive learning.
The system workflow includes lecture playback, continuous attention monitoring, distraction detection, flashcard generation, and note creation. Implemented as a Flutter mobile app, NeuroNote integrates video playback, gaze detection, transcription, and learning assistance modules. Testing shows that it effectively detects distractions and helps users recover missed content in real time, improving engagement and learning efficiency.
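One possible local representation of the flashcards and notes produced by this workflow is sketched below; the Flashcard class, table layout, and field names are assumptions made for illustration rather than the published NeuroNote schema, and the DDL would be run once when the app's SQLite store is first created.

```dart
// Illustrative local data model for recall flashcards tied to a
// missed lecture segment.
class Flashcard {
  final int lectureId;
  final int startSec; // start of the missed segment
  final int endSec;   // end of the missed segment
  final String summary;

  const Flashcard(this.lectureId, this.startSec, this.endSec, this.summary);

  Map<String, Object> toRow() => {
        'lecture_id': lectureId,
        'start_sec': startSec,
        'end_sec': endSec,
        'summary': summary,
      };
}

// Assumed SQLite tables for lectures, generated notes, and flashcards.
const createTables = '''
CREATE TABLE IF NOT EXISTS lecture (
  id INTEGER PRIMARY KEY,
  title TEXT NOT NULL,
  transcript TEXT
);
CREATE TABLE IF NOT EXISTS note (
  id INTEGER PRIMARY KEY,
  lecture_id INTEGER REFERENCES lecture(id),
  summary TEXT NOT NULL
);
CREATE TABLE IF NOT EXISTS flashcard (
  id INTEGER PRIMARY KEY,
  lecture_id INTEGER REFERENCES lecture(id),
  start_sec INTEGER NOT NULL,
  end_sec INTEGER NOT NULL,
  summary TEXT NOT NULL
);
''';

void main() {
  // createTables would be executed once when the local database is opened.
  final card = Flashcard(1, 300, 345, 'Backpropagation applies the chain rule.');
  print(card.toRow());
}
```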
Conclusion
This paper presented NeuroNote, an attention-aware mobile learning application designed to assist neurodivergent learners. The system integrates gaze monitoring, lecture transcription, flashcard generation, and note summarization to support learner engagement during video lectures. By detecting distraction and presenting recall flashcards, the system helps learners recover missed information efficiently. The results demonstrate the potential of attention-aware learning tools in improving digital learning experiences.
References
[1] Gupta, R. Mehta, and L. Chen, “Smartphone-based eye tracking system using edge intelligence and model optimisation,” IEEE Access, vol. 13, pp. 11245–11258, Jan. 2025.
[2] S. Nair and P. Verma, “GenAI for Neurodivergent Students: Enhancing Inclusive Learning through Adaptive Artificial Intelligence,” International Journal of Educational Technology and AI, vol. 7, no. 2, pp. 89–104, Jun. 2025.
[3] K. Yamamoto, J. Singh, and R. D’Souza, “Deep Learning for Hand Gestures: A Vision-Based Approach for Cognitive Accessibility,” IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 33, no. 1, pp. 45–56, Jan. 2025.
[4] L. Roberts and A. Bhattacharya, “LARF – Dyslexia Reading Framework using Artificial Intelligence for Early Detection,” Journal of Assistive Learning Technologies, vol. 9, no. 4, pp. 210–223, Apr. 2025.
[5] D. Kim and E. Patel, “AI Ethics: Trans & Neurodivergences in Machine Learning Systems,” in Proceedings of the IEEE Conference on Human-Centric AI (HCAI), pp. 155–162, Mar. 2025.
[6] S. K. D’Mello and A. Graesser, “Automatic Detection of Learner’s Affect from Gross Body Language,” Applied Artificial Intelligence, vol. 26, no. 1–2, pp. 28–48, 2012.